
Fix multimodal warmup #1465


Open: wants to merge 2 commits into base: habana_main

Conversation

adobrzyn

No description provided.

Signed-off-by: Agata Dobrzyniewicz <adobrzyniewicz@habana.ai>
@adobrzyn (Author)

/run-gaudi-tests

Signed-off-by: Agata Dobrzyniewicz <adobrzyniewicz@habana.ai>
@madamczyk-intel requested a review from Copilot on June 23, 2025 08:23

@Copilot (Copilot AI) left a comment


Pull Request Overview

This PR fixes the multimodal warmup process by removing the available_mem parameter and its associated memory-check logic, changing how memory usage is accounted for during warmup.

  • Removed the available_mem parameter from _warmup_multimodal_graph.
  • Removed the available_mem-based memory check and its decrement logic during warmup (see the sketch below).
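
To make the change concrete, here is a minimal, self-contained sketch of the before/after shape the overview describes. This is not the actual hpu_model_runner.py code: the toy class, the bucket list, and the helper methods (_estimate_graph_memory, _run_multimodal_warmup, multimodal_buckets) are hypothetical stand-ins, and the _before/_after suffixes exist only to show both versions side by side.

```python
# Illustrative sketch only; the real method lives in vllm/worker/hpu_model_runner.py.
class ToyHPUModelRunner:
    def __init__(self):
        # Hypothetical (batch_size, seq_len) warmup buckets.
        self.multimodal_buckets = [(1, 128), (2, 256), (4, 512)]

    def _estimate_graph_memory(self, batch_size, seq_len):
        # Stand-in for a real per-graph memory estimate.
        return batch_size * seq_len

    def _run_multimodal_warmup(self, batch_size, seq_len, kv_caches):
        print(f"warming up bucket bs={batch_size}, seq={seq_len}")

    # Before the PR: warmup stops once the estimated graph memory exceeds
    # the available_mem budget, which is decremented per warmed-up bucket.
    def _warmup_multimodal_graph_before(self, kv_caches, available_mem):
        for batch_size, seq_len in self.multimodal_buckets:
            graph_mem = self._estimate_graph_memory(batch_size, seq_len)
            if graph_mem > available_mem:
                break
            self._run_multimodal_warmup(batch_size, seq_len, kv_caches)
            available_mem -= graph_mem

    # After the PR: the available_mem parameter and the per-bucket accounting
    # are gone; every bucket is warmed up, and any memory limiting is
    # handled elsewhere.
    def _warmup_multimodal_graph_after(self, kv_caches):
        for batch_size, seq_len in self.multimodal_buckets:
            self._run_multimodal_warmup(batch_size, seq_len, kv_caches)


if __name__ == "__main__":
    runner = ToyHPUModelRunner()
    runner._warmup_multimodal_graph_before(kv_caches=[], available_mem=600)
    runner._warmup_multimodal_graph_after(kv_caches=[])
```

The design point is simply that the per-bucket memory budget check is no longer tracked inside the warmup loop; if a memory limit is still needed, it has to be enforced elsewhere, which is what the review comments below ask the author to confirm.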
Comments suppressed due to low confidence (2)

vllm/worker/hpu_model_runner.py:2763 (context: `kv_caches,`)

  • The available_mem parameter was removed from _warmup_multimodal_graph. Please confirm that any necessary memory-limiting logic is now handled elsewhere, and update related documentation or comments accordingly.

vllm/worker/hpu_model_runner.py:2793 (context: `torch.distributed.ReduceOp.MAX)`)

  • By removing the decrement of available_mem, the memory-usage accounting strategy has changed. Ensure that this adjustment is intentional and that any required memory-usage checks are implemented elsewhere in the code.

@adobrzyn (Author)

/run-gaudi-tests

1 similar comment
@adobrzyn (Author)

/run-gaudi-tests
